Distant generalization by feedforward neural networks
Author
Abstract
This paper discusses the notion of generalization of training samples over long distances in the input space of a feedforward neural network. Such a generalization might occur in various ways, which differ in how great the contribution of different training features should be. The structure of a neuron in a feedforward neural network is analyzed, and it is concluded that the actual performance of the discussed generalization in such neural networks may be problematic: while such neural networks might be capable of such distant generalization, a random and spurious generalization may occur as well. To illustrate the differences in generalizing the same function by different learning machines, results given by support vector machines are also presented.
Keywords: supervised learning, generalization, feedforward neural network, support vector machine
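The contrast in the abstract can be made concrete. Below is a minimal numpy sketch (the weights, points, and kernel width are illustrative assumptions, not taken from the paper): a single tanh neuron saturates along any direction, so it keeps producing confident outputs arbitrarily far from the training region, whereas an RBF kernel value decays with distance, so a kernel machine's prediction falls back toward its bias there.

```python
import numpy as np

# A single tanh neuron f(x) = tanh(w.x + b): once |w.x + b| is large,
# the response is saturated, so the neuron "generalizes" with full
# confidence arbitrarily far from any training sample.
w, b = np.array([1.0, -0.5]), 0.1
neuron = lambda x: np.tanh(x @ w + b)

x_near = np.array([1.0, 1.0])      # inside a hypothetical training region
x_far = np.array([100.0, 100.0])   # far outside it

out_near = neuron(x_near)          # moderate activation
out_far = neuron(x_far)            # saturated near 1.0: confident, but unsupported

# An RBF kernel value, by contrast, decays with distance from its center,
# so an RBF support vector machine's output reverts toward its bias
# far from the training samples instead of extrapolating.
center = np.array([1.0, 1.0])
rbf = lambda x, c: np.exp(-np.sum((x - c) ** 2))
k_near = rbf(x_near, center)       # 1.0 at the center
k_far = rbf(x_far, center)         # vanishes far away
```

The design difference is exactly the one the abstract points at: the neuron's response is global (constant along rays), while the kernel's is local, which is why the two machines generalize the same function so differently over long distances.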
Similar resources
Solving the dilemma of long range generalization by feedforward neural networks
This paper discusses the problem of generalization of training samples over long distances in the input space of a feedforward learning machine. Such a generalization type might produce a significant dilemma of how great the contribution of different training samples should be in generalizing that input space. The structure of a neuron in a feedforward neural network is analyzed and it is conc...
On Interference of Signals and Generalization in Feedforward Neural Networks
This paper studies how the generalization ability of neurons can be affected by the mutual processing of different signals. The study is based on a feedforward artificial neural network, which is used here as a model of the very basic processes in a network of biological neurons. The mutual processing of signals, called here an interference of signals, can possibly be a good model of pa...
A Case Study on Stacked Generalization with Software Reliability Growth Modeling Data
We study stacked generalization performance with software reliability growth data by using a pseudoinverse learning algorithm for feedforward neural networks. The experiments show that for noisy data, using stacked generalization cannot improve the network performance when overtrained networks are engaged. With properly trained networks, stacked generalization can improve the network genera...
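The combining step this abstract relies on can be sketched in a few lines. This is a minimal numpy illustration of stacked generalization with a pseudoinverse (least-squares) level-1 solve; the linear base learners, data, and shapes are stand-ins of my own, not the paper's networks, and the held-out split that proper stacking uses for level-0 predictions is omitted for brevity.

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.standard_normal((100, 3))
y = X @ np.array([1.0, -2.0, 0.5]) + 0.1 * rng.standard_normal(100)

# Level 0: two simple base learners (illustrative stand-ins for
# trained feedforward networks), both fitted by pseudoinverse.
pred_a = X @ (np.linalg.pinv(X) @ y)              # full least-squares fit
pred_b = X[:, :1] @ (np.linalg.pinv(X[:, :1]) @ y)  # deliberately weaker learner

# Level 1: combine the base predictions with one more pseudoinverse
# solve, i.e. a closed-form least-squares weighting of the learners.
# (Real stacking would use out-of-fold level-0 predictions here.)
Z = np.column_stack([pred_a, pred_b])
weights = np.linalg.pinv(Z) @ y
stacked = Z @ weights

mse = lambda p: float(np.mean((p - y) ** 2))
```

Because the level-1 solve is itself least-squares over the span of the base predictions, the stacked output can never fit the training data worse than the weaker base learner; the abstract's point is that this in-sample gain only translates into better generalization when the base networks are not overtrained.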
Corruption of Generalizing Signals in Densely Connected Feedforward Neural Networks with Hyperbolic Tangent Activation Functions
This paper discusses the propagation of signals in generic densely connected multilayered feedforward neural networks. It is concluded that dense connectivity combined with the hyperbolic tangent activation functions of the neurons may cause a highly random, spurious generalization, which decreases the overall performance and reliability of a neural network and can be mistaken for overfitting...
A PAC-Bayesian Approach to Spectrally-Normalized Margin Bounds for Neural Networks
We present a generalization bound for feedforward neural networks in terms of the product of the spectral norms of the layers and the Frobenius norm of the weights. The generalization bound is derived using a PAC-Bayes analysis.
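The capacity quantity this abstract describes is easy to compute for a concrete network. Below is a numpy sketch of the scale of such a spectrally-normalized term: the product of per-layer spectral norms combined with Frobenius-norm ratios. The layer shapes are my own illustrative assumptions, and constants, depth, and margin factors from the actual bound are omitted.

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical weight matrices of a small 3-layer feedforward network
# (shapes chosen for illustration only).
weights = [rng.standard_normal((16, 8)),
           rng.standard_normal((16, 16)),
           rng.standard_normal((1, 16))]

def spectral_norm(W):
    # The operator 2-norm of a layer is its largest singular value.
    return float(np.linalg.svd(W, compute_uv=False)[0])

# Product of per-layer spectral norms: how much the network can
# stretch an input in the worst case.
spec_product = float(np.prod([spectral_norm(W) for W in weights]))

# Sum of squared Frobenius-to-spectral ratios (each term is >= 1,
# since ||W||_F >= ||W||_2 for any matrix).
ratio_sum = float(sum(np.linalg.norm(W) ** 2 / spectral_norm(W) ** 2
                      for W in weights))

# Schematic capacity term combining the two factors, with the bound's
# constants, depth, and margin dependence left out.
capacity = spec_product ** 2 * ratio_sum
```

The intuition is that the spectral product controls the network's worst-case Lipschitz behaviour, while the Frobenius ratios measure how many directions each layer actually uses, which is why both appear in the bound.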
Journal: CoRR
Volume: abs/cs/0505021
Publication date: 2005